Bias/Variance Decompositions for Likelihood-Based Estimators

Author

  • Tom Heskes
Abstract

The bias/variance decomposition of mean-squared error is well understood and relatively straightforward. In this note, a similar simple decomposition is derived, valid for any kind of error measure that, when using the appropriate probability model, can be derived from a Kullback-Leibler divergence or log-likelihood.
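The flavour of the result can be illustrated numerically. The sketch below is my own illustration, not code from the paper: it draws many hypothetical data sets, fits a (slightly smoothed, purely for numerical stability) multinomial estimate of a small discrete distribution from each, and checks two identities: first the familiar mean-squared-error split (error = squared bias + variance) for a scalar estimate, then an analogous split of the expected Kullback-Leibler error in which the role of the average prediction is played by the normalized geometric mean of the fitted distributions. The three-point target distribution, the sample size, and the smoothing constant are all arbitrary assumptions; consult the paper for the exact statement of its decomposition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a three-point "true" distribution and a multinomial
# estimator fitted from N samples, repeated over many simulated data sets.
t = np.array([0.5, 0.3, 0.2])     # true distribution (arbitrary choice)
N, reps = 20, 20000

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

def fit_one():
    counts = rng.multinomial(N, t)
    return (counts + 0.5) / (N + 0.5 * len(t))   # light smoothing, for stability

models = np.array([fit_one() for _ in range(reps)])

# 1) Classical decomposition of the mean-squared error of a scalar estimate,
#    here the estimated probability of the first outcome.
est = models[:, 0]
mse = np.mean((est - t[0]) ** 2)
bias_sq = (np.mean(est) - t[0]) ** 2
var = np.mean((est - np.mean(est)) ** 2)
print(f"MSE {mse:.6f}  =  bias^2 + variance {bias_sq + var:.6f}")

# 2) A KL analogue: the "average" model is the normalized geometric mean
#    of the fitted distributions (exponentiated average log-probability).
q_bar = np.exp(np.mean(np.log(models), axis=0))
q_bar /= q_bar.sum()

expected_err = np.mean([kl(t, q) for q in models])     # E[ KL(true || fit) ]
bias_term = kl(t, q_bar)                               # KL(true || average)
var_term = np.mean([kl(q_bar, q) for q in models])     # E[ KL(average || fit) ]
print(f"KL  {expected_err:.6f}  =  bias + variance {bias_term + var_term:.6f}")
```

Both printed pairs agree to floating-point precision because each identity is algebraic over the set of simulated models; the bias term isolates the systematic part of the error and the variance term the part due to fluctuations across data sets.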


Similar articles

Estimation of Parameters for an Extended Generalized Half Logistic Distribution Based on Complete and Censored Data

This paper considers an Extended Generalized Half Logistic distribution. We derive some properties of this distribution and then discuss estimation of its parameters by the method of moments, maximum likelihood, and the new minimum spacing distance estimator, based on complete data. Also, maximum likelihood equations for estimating the parameters based on Type-I and Type-II ...


Mixture of Normal Mean-Variance of Lindley Distributions

In this paper, a new mixture model based on the normal mean-variance mixture of Lindley (NMVL) distribution is considered. The proposed model is heavy-tailed and multimodal and can be used to deal with asymmetric data in a variety of theoretical and applied problems. We present a computationally feasible EM algorithm for computing the maximum likelihood estimates. T...


On the efficiency of a semi-parametric GARCH model

Financial time series exhibit time-varying volatility and non-Gaussian distributions. There has been considerable research on GARCH models for dealing with these features of financial data. Since in practice the true error distribution is unknown, various quasi-maximum likelihood methods based on different assumptions about the error distribution have been studied in the literature. However, ...
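As a generic illustration of quasi-maximum likelihood in this setting (a sketch under my own assumptions, not the semi-parametric estimator studied in the paper), the snippet below simulates a GARCH(1,1) series whose true innovations are heavier-tailed Student-t and then estimates the parameters by maximizing the Gaussian quasi-likelihood. The parameter values, series length, and degrees of freedom are arbitrary.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# --- simulate a GARCH(1,1) series with Student-t innovations ---
# (heavier-tailed than the Gaussian assumed by the quasi-likelihood below)
T, omega, alpha, beta = 3000, 0.1, 0.1, 0.8
z = rng.standard_t(df=5, size=T) / np.sqrt(5 / 3)   # unit-variance t innovations
r = np.empty(T)
s2 = omega / (1 - alpha - beta)                     # start at unconditional variance
for t in range(T):
    r[t] = np.sqrt(s2) * z[t]
    s2 = omega + alpha * r[t] ** 2 + beta * s2

# --- negative Gaussian quasi-(log-)likelihood of a GARCH(1,1) model ---
def neg_qll(params):
    w, a, b = params
    if w <= 0 or a < 0 or b < 0 or a + b >= 1:      # enforce stationarity region
        return np.inf
    s2 = np.empty(T)
    s2[0] = r.var()
    for t in range(1, T):
        s2[t] = w + a * r[t - 1] ** 2 + b * s2[t - 1]
    return 0.5 * np.sum(np.log(s2) + r ** 2 / s2)

fit = minimize(neg_qll, x0=np.array([0.05, 0.05, 0.9]), method="Nelder-Mead")
print("quasi-ML estimates (omega, alpha, beta):", np.round(fit.x, 3))
```

Even though the Gaussian density is the wrong innovation model here, the quasi-ML point estimates of (omega, alpha, beta) are typically close to the values used in the simulation, which is the sense in which quasi-likelihood methods remain usable when the true error distribution is unknown.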


Biases of the Maximum Likelihood and Cohen-Sackrowitz Estimators for the Tree-order Model

Consider s + 1 univariate normal populations with common variance σ² and means μ_i, i = 0, 1, ..., s, constrained by the tree-order restrictions μ_0 ≤ μ_i, i = 1, 2, ..., s. For certain sequences μ_0, μ_1, ..., the maximum likelihood-based estimator (MLBE) of μ_0 diverges to −∞ as s → ∞ and its bias is unbounded. By contrast, the bias of an alternative estimator of μ_0 proposed by Cohen and Sackrowitz ...
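The downward drift described here is easy to reproduce in a small simulation. The sketch below is my own illustration, not the paper's construction: it assumes one unit-variance observation per population and all true means equal to zero, computes the restricted MLE of μ_0 by profiling out the remaining means (for fixed μ_0 = m, each other mean is optimally max(x_i, m)), and reports the Monte Carlo average of the estimator as s grows.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)

def mle_mu0(x):
    """Restricted MLE of mu_0 under the tree order mu_0 <= mu_i (equal weights).

    For fixed mu_0 = m, each mu_i (i >= 1) is optimally max(x[i], m), so only
    observations falling below m contribute to the profiled criterion."""
    x0, rest = x[0], x[1:]
    def profile(m):
        return (x0 - m) ** 2 + np.sum(np.minimum(rest - m, 0.0) ** 2)
    return minimize_scalar(profile).x

# All true means are zero, so the reported average is the estimator's bias.
for s in (5, 50, 500):
    draws = [mle_mu0(rng.normal(size=s + 1)) for _ in range(2000)]
    print(f"s = {s:3d}:  average restricted MLE of mu_0 = {np.mean(draws):+.3f}")
```

In this setup the average drifts steadily further below zero as s increases, which is the behaviour described in the abstract.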


A New Adjusted Residual Likelihood Method for the Fay-Herriot Small Area Model

In the context of the Fay-Herriot model, a mixed regression model routinely used to combine information from various sources in small area estimation, certain adjustments to a standard likelihood (e.g., profile or residual) have recently been proposed in order to produce strictly positive and consistent model variance estimators. These adjustments protect the resulting empirical best linear unbiased predictors ...
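A minimal sketch of the general idea follows; it is my own illustration with simulated data, arbitrary parameter values, and the simple adjustment factor h(A) = A, which is one choice from this literature and not necessarily the adjustment proposed in the paper. The residual likelihood of the Fay-Herriot model variance A is multiplied by a factor that vanishes at A = 0, so the maximizer is always strictly positive.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)

# --- simulate a small Fay-Herriot data set (all values are assumptions) ---
m = 30                                        # number of small areas
X = np.column_stack([np.ones(m), rng.normal(size=m)])
beta_true, A_true = np.array([1.0, 2.0]), 0.5
D = rng.uniform(0.3, 1.0, size=m)             # known sampling variances
y = X @ beta_true + rng.normal(0, np.sqrt(A_true), m) + rng.normal(0, np.sqrt(D))

def residual_loglik(A):
    """Residual (REML) log-likelihood of the model variance A, up to a constant."""
    V_inv = 1.0 / (A + D)                     # V is diagonal: A + D_i
    XtVinvX = X.T @ (V_inv[:, None] * X)
    beta_hat = np.linalg.solve(XtVinvX, X.T @ (V_inv * y))
    resid = y - X @ beta_hat
    return -0.5 * (np.sum(np.log(A + D))
                   + np.linalg.slogdet(XtVinvX)[1]
                   + np.sum(V_inv * resid ** 2))

# Standard REML can be maximized at the boundary A = 0; the adjusted residual
# likelihood multiplies in h(A) = A (an assumption here), which vanishes at 0
# and therefore keeps the maximizer strictly positive.
def neg_adjusted(A):
    return -(np.log(A) + residual_loglik(A))

A_hat = minimize_scalar(neg_adjusted, bounds=(1e-8, 20.0), method="bounded").x
print(f"adjusted-REML estimate of the model variance A: {A_hat:.3f}")
```

The log h(A) term pushes the estimate away from zero in small samples, which is how such adjustments yield the strictly positive model variance estimators described above.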



Journal:
  • Neural Computation

Volume 10, Issue 6

Pages: -

Publication year: 1998